Similar resources
On interval-subgradient and no-good cuts
Interval-gradient cuts are (nonlinear) valid inequalities for nonconvex NLPs, defined for constraints g(x) ≤ 0 with g continuously differentiable in a box [x̲, x̄]. In this paper we define interval-subgradient cuts, a generalization to the case of nondifferentiable g, and show that no-good cuts (which have the form ‖x−x̂‖ ≥ ε for some norm and positive constant ε) are a special case of interva...
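For context, here is a minimal sketch of the interval-gradient construction that the paper generalizes, under an assumption not stated in the excerpt, namely that ∇g(x) lies componentwise in an interval vector [d̲, d̄] for every x in the box. By the mean value theorem, for any reference point x̂ in the box,
\[
g(x) \;\ge\; g(\hat{x}) + \sum_i \min\bigl\{ \underline{d}_i\,(x_i - \hat{x}_i),\; \bar{d}_i\,(x_i - \hat{x}_i) \bigr\},
\]
so every point satisfying g(x) ≤ 0 also satisfies the (nonconvex) cut
\[
g(\hat{x}) + \sum_i \min\bigl\{ \underline{d}_i\,(x_i - \hat{x}_i),\; \bar{d}_i\,(x_i - \hat{x}_i) \bigr\} \;\le\; 0 .
\]
The interval-subgradient version presumably replaces the gradient enclosure by an enclosure of subgradients, and, as the abstract states, the no-good cut ‖x−x̂‖ ≥ ε is recovered as a special case.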
Re-revisiting Learning on Hypergraphs: Confidence Interval and Subgradient Method
We revisit semi-supervised learning on hypergraphs. As in previous approaches, our method uses a convex program whose objective function is not everywhere differentiable. We exploit the non-uniqueness of the optimal solutions and consider confidence intervals, which give the exact ranges that unlabeled vertices take in any optimal solution. Moreover, we give a much simpler approach for solvin...
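The excerpt does not give the paper's objective or hypergraph regularizer, so the snippet below is only a generic projected-subgradient loop in Python (the function names and the toy problem are illustrative assumptions, not the authors' algorithm), showing how a convex but not everywhere differentiable objective is typically minimized.

import numpy as np

def projected_subgradient(subgrad, project, x0, steps=500):
    # Generic projected subgradient method: x_{k+1} = P(x_k - t_k * g_k),
    # where g_k is any subgradient at x_k and t_k = 1/sqrt(k+1) is a
    # common diminishing step size.
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        x = project(x - subgrad(x) / np.sqrt(k + 1.0))
    return x

# Toy example: minimize f(x) = |x - 3| over the interval [0, 2];
# sign(x - 3) is a subgradient of f, and np.clip projects onto [0, 2].
print(projected_subgradient(lambda x: np.sign(x - 3.0),
                            lambda x: np.clip(x, 0.0, 2.0),
                            x0=np.array([0.5])))  # approaches 2.0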
Three Cuts for Accelerated Interval Propagation
This paper addresses the problem of nonlinear multivariate root finding. In an earlier paper we describe a system called Newton which finds roots of systems of nonlinear equations using refinements of interval methods. The refinements are inspired by AI constraint propagation techniques. Newton is competitive with continuation methods on most benchmarks and can handle a variety of cases that are infe...
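The excerpt includes no pseudocode for Newton's box refinements, so the following is only a textbook one-dimensional interval Newton contraction in Python (a hypothetical helper, not the system from the paper), illustrating how interval methods narrow a box known to contain a root.

def interval_newton_step(f, df_enclosure, lo, hi):
    # One 1-D interval Newton contraction: N([x]) = m - f(m) / F'([x]),
    # intersected with [x], where m is the midpoint of [lo, hi] and
    # df_enclosure(lo, hi) returns an interval (dlo, dhi) containing f'
    # over the box. Assumes 0 lies outside the derivative enclosure.
    m = 0.5 * (lo + hi)
    fm = f(m)
    dlo, dhi = df_enclosure(lo, hi)
    if dlo <= 0.0 <= dhi:
        return lo, hi  # extended division not handled in this sketch
    quotients = (fm / dlo, fm / dhi)
    nlo, nhi = m - max(quotients), m - min(quotients)
    return max(lo, nlo), min(hi, nhi)

# Example: f(x) = x^2 - 2 on [1, 2], with f'(x) = 2x enclosed by [2, 4];
# one step shrinks the box to [1.375, 1.4375], which still contains sqrt(2).
print(interval_newton_step(lambda x: x * x - 2.0,
                           lambda lo, hi: (2.0 * lo, 2.0 * hi),
                           1.0, 2.0))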
A note on "An interval type-2 fuzzy extension of the TOPSIS method using alpha cuts"
The technique for order of preference by similarity to ideal solution (TOPSIS) is a method based on ideal solutions, in which the most desirable alternative should have the shortest distance from the positive ideal solution and the longest distance from the negative ideal solution. Depending on the type of evaluations or the method of ranking, different approaches have been proposed to calculate distances ...
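For reference, the distance-to-ideal computation in classic (crisp) TOPSIS fits in a few lines of Python. The sketch below is not the interval type-2 fuzzy extension discussed in the note, and the decision matrix, weights, and criterion types are made-up illustration data.

import numpy as np

def topsis(decision, weights, benefit):
    # Classic crisp TOPSIS: normalize, weight, measure Euclidean distance
    # to the positive and negative ideal solutions, and rank alternatives
    # by relative closeness (larger is better).
    D = np.asarray(decision, dtype=float)
    V = np.asarray(weights, dtype=float) * D / np.linalg.norm(D, axis=0)
    pos = np.where(benefit, V.max(axis=0), V.min(axis=0))  # positive ideal
    neg = np.where(benefit, V.min(axis=0), V.max(axis=0))  # negative ideal
    d_pos = np.linalg.norm(V - pos, axis=1)
    d_neg = np.linalg.norm(V - neg, axis=1)
    return d_neg / (d_pos + d_neg)

# Three alternatives, three criteria: price (cost), memory and speed (benefit).
scores = topsis([[250.0, 16.0, 12.0], [200.0, 16.0, 8.0], [300.0, 32.0, 16.0]],
                weights=[0.4, 0.3, 0.3],
                benefit=[False, True, True])
print(scores, scores.argmax())  # closeness coefficients and best alternative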
On Subgradient Projectors
The subgradient projector is of considerable importance in convex optimization because it plays the key role in Polyak’s seminal work — and the many papers it spawned — on subgradient projection algorithms for solving convex feasibility problems. In this paper, we offer a systematic study of the subgradient projector. Fundamental properties such as continuity, nonexpansiveness, and monotonicity...
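A commonly used closed form for the Polyak-style subgradient projector onto a level set {x : f(x) ≤ 0} is sketched below in Python; it assumes that setting, and the paper's exact normalization and hypotheses may differ.

import numpy as np

def subgradient_projector(f, subgrad, x):
    # If f(x) > 0, step from x along a subgradient s of f at x:
    #     P(x) = x - (f(x) / ||s||^2) * s,
    # which lands on the separating hyperplane induced by s;
    # if f(x) <= 0, the point already lies in the level set and is returned.
    fx = f(x)
    if fx <= 0.0:
        return x
    s = subgrad(x)
    return x - (fx / np.dot(s, s)) * s

# Example: f(x) = ||x|| - 1, whose level set is the unit ball; a subgradient
# at x != 0 is x / ||x||.  The projector maps (3, 4) to (0.6, 0.8).
print(subgradient_projector(lambda v: np.linalg.norm(v) - 1.0,
                            lambda v: v / np.linalg.norm(v),
                            np.array([3.0, 4.0])))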
Journal
Journal title: Operations Research Letters
Year: 2010
ISSN: 0167-6377
DOI: 10.1016/j.orl.2010.05.010